Finding the M Most Probable Configurations using Loopy Belief Propagation
Author
Abstract
Loopy belief propagation (BP) has been successfully used in a number of difficult graphical models to find the most probable configuration of the hidden variables. In applications ranging from protein folding to image analysis one would like to find not just the best configuration but rather the top M. While this problem has been solved using the junction tree formalism, in many real-world problems the clique size in the junction tree is prohibitively large. In this work we address the problem of finding the M best configurations when exact inference is impossible. We start by developing a new exact inference algorithm for calculating the best configurations that uses only max-marginals. For approximate inference, we replace the max-marginals with the beliefs calculated using max-product BP and generalized BP. We show empirically that the algorithm can accurately and rapidly approximate the M best configurations in graphs with hundreds of variables.
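To make the role of max-marginals concrete, the sketch below (an illustration of the idea, not the authors' code) runs max-product message passing on a toy chain MRF, where BP is exact. It demonstrates two facts the approach builds on: the MAP configuration can be read off the max-marginals coordinate-wise, and the score of the second-best configuration is again a function of the max-marginals alone, namely the largest max-marginal entry that disagrees with the MAP assignment. The paper's full algorithm recovers the assignments themselves for arbitrary M by repeatedly constraining variables and recomputing max-marginals, and on loopy graphs the exact max-marginals are replaced by max-product or generalized BP beliefs; all names and the toy potentials below are invented for the example.

import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy chain MRF: n variables with k states each.  On a chain (no loops)
# max-product BP is exact, so the quantities below are true max-marginals.
n, k = 6, 3
log_node = rng.normal(size=(n, k))          # log psi_i(x_i)
log_edge = rng.normal(size=(n - 1, k, k))   # log psi_{i,i+1}(x_i, x_{i+1})

def score(x):
    """Unnormalised log-probability of a full configuration x."""
    s = sum(log_node[i, x[i]] for i in range(n))
    s += sum(log_edge[i, x[i], x[i + 1]] for i in range(n - 1))
    return s

# --- max-product message passing (max-sum in log space) ------------------
fwd = np.zeros((n, k))   # fwd[i] = message sent from variable i-1 into i
bwd = np.zeros((n, k))   # bwd[i] = message sent from variable i+1 into i
for i in range(1, n):
    fwd[i] = np.max(log_node[i - 1] + fwd[i - 1] + log_edge[i - 1].T, axis=1)
for i in range(n - 2, -1, -1):
    bwd[i] = np.max(log_node[i + 1] + bwd[i + 1] + log_edge[i], axis=1)

# Max-marginals: M[i, v] = best log-score over all configurations with x_i = v.
M = log_node + fwd + bwd

# 1) The MAP configuration can be read off the max-marginals directly.
x_map = M.argmax(axis=1)

# 2) The score of the second-best configuration is also determined by the
#    max-marginals: it is the largest entry M[i, v] over all (i, v) that
#    disagree with the MAP assignment.
M_off = M.copy()
M_off[np.arange(n), x_map] = -np.inf
second_best_score = M_off.max()

# Check against brute-force enumeration of all k**n configurations.
all_scores = sorted((score(x) for x in itertools.product(range(k), repeat=n)),
                    reverse=True)
print("MAP score      :", score(x_map), "vs brute force", all_scores[0])
print("2nd-best score :", second_best_score, "vs brute force", all_scores[1])

Working in log space (max-sum rather than max-product) avoids numerical underflow and is how such message passing is usually implemented; the brute-force check is only feasible here because the toy model has 3^6 = 729 configurations.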
Similar papers
Statistical physics of loopy interactions: Independent-loop approximation and beyond
We consider an interacting system of spin variables on a loopy interaction graph, identified by a tree graph and a set of loopy interactions. We start from a high-temperature expansion for loopy interactions represented by a sum of non-negative contributions from all the possible frustration-free loop configurations. We then compute the loop corrections using different approximations for the no...
Correctness of Belief Propagation in Gaussian Graphical Models of Arbitrary Topology
Graphical models, such as Bayesian networks and Markov random fields, represent statistical dependencies of variables by a graph. Local "belief propagation" rules of the sort proposed by Pearl [20] are guaranteed to converge to the correct posterior probabilities in singly connected graphs. Recently good performance has been obtained by using these same rules on graphs with loops, a method known a...
Loopy belief propagation for approximate inference: an empirical study
Recently, a number of researchers have demonstrated excellent performance by using "loopy belief propagation", that is, by running Pearl's polytree algorithm in a Bayesian network with loops. The most dramatic instance is the near Shannon-limit performance of "Turbo Codes", error-correcting codes whose decoding algorithm is equivalent to loopy belief propagation. In this paper we ask: is there something sp...
Improved sampling using loopy belief propagation for probabilistic model building genetic programming
In recent years, probabilistic model building genetic programming (PMBGP) for program optimization has attracted considerable interest. PMBGPs generally use probabilistic logic sampling (PLS) to generate new individuals. However, the generation of the most probable solutions (MPSs), i.e., solutions with the highest probability, is not guaranteed. In the present paper, we introduce loopy belief ...
Learning How to Inpaint from Global Image Statistics
Inpainting is the problem of filling-in holes in images. Considerable progress has been made by techniques that use the immediate boundary of the hole and some prior information on images to solve this problem. These algorithms successfully solve the local inpainting problem but they must, by definition, give the same completion to any two holes that have the same boundary, even when the rest o...